An information approach to regularization parameter selection under model misspecification

Authors

  • A M Urmanov
  • R E Uhrig
Abstract

We review the information approach to regularization parameter selection and its information-complexity extension for the solution of discrete ill-posed problems. An information criterion for regularization parameter selection was first proposed by Shibata in the context of ridge regression as an extension of Takeuchi's information criterion. In the information approach, the regularization parameter value is chosen to maximize the mean expected log-likelihood (MELL) of a model whose parameters are estimated by the maximum penalized likelihood method. Under the Gaussian noise assumption, this choice coincides with the minimum mean predictive error choice. Maximizing the MELL corresponds to minimizing the mean Kullback–Leibler information, which measures the deviation of the approximating (model) distribution from the true one. The resulting regularization parameter selection methods can handle possible functional and distributional misspecifications when the usual assumptions of Gaussian noise and/or a linear relationship have been made but are not met. We also suggest that in engineering applications it is beneficial to find ways of lowering the risk of obtaining grossly under-regularized solutions, and that the new information-complexity regularization parameter selection method (RPSM) is one such possibility. Several examples of applying the reviewed RPSMs are given.
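As a minimal sketch of the general idea (not the authors' specific RPSM): under a Gaussian noise assumption, an information-criterion choice of a ridge (Tikhonov) regularization parameter can be made by minimizing an AIC-style score over a grid, with the effective degrees of freedom given by the trace of the ridge hat matrix. All function names and the test data below are illustrative.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Penalized estimate: minimizes ||y - X b||^2 + lam * ||b||^2."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

def aic_score(X, y, lam):
    """AIC-style criterion n*log(RSS/n) + 2*df, where df is the
    effective degrees of freedom, trace of the ridge hat matrix."""
    n, p = X.shape
    b = ridge_fit(X, y, lam)
    rss = np.sum((y - X @ b) ** 2)
    df = np.trace(X @ np.linalg.solve(X.T @ X + lam * np.eye(p), X.T))
    return n * np.log(rss / n) + 2.0 * df

def select_lambda(X, y, grid):
    """Pick the regularization parameter minimizing the criterion over a grid."""
    scores = [aic_score(X, y, lam) for lam in grid]
    return grid[int(np.argmin(scores))]

# Illustrative use on synthetic data with a few nonzero coefficients.
rng = np.random.default_rng(0)
n, p = 60, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [2.0, -1.0, 0.5]
y = X @ beta + 0.5 * rng.standard_normal(n)
grid = list(np.logspace(-3, 2, 30))
lam = select_lambda(X, y, grid)
```

A Kullback–Leibler-based criterion such as ICOMP would replace `aic_score` with a different penalty term; the grid-minimization structure stays the same.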


Similar articles

Information Complexity-Based Regularization Parameter Selection for Solution of Ill-Conditioned Inverse Problems

We propose an information complexity-based regularization parameter selection method for solution of ill-conditioned inverse problems. The regularization parameter is selected to be the minimizer of the Kullback-Leibler (KL) distance between the unknown data-generating distribution and the fitted distribution. The KL distance is approximated by an information complexity (ICOMP) criterion develo...


Automatic estimation of regularization parameter by active constraint balancing method for 3D inversion of gravity data

Gravity data inversion is one of the important steps in the interpretation of practical gravity data. The inversion result can be obtained by minimizing the Tikhonov objective function. Determining an optimal regularization parameter is highly important in gravity data inversion. In this work, an attempt was made to use the active constraint balancing (ACB) method to select the...


Regularization Parameter Selection in the Group Lasso

This article discusses the problem of choosing a regularization parameter in the group Lasso proposed by Yuan and Lin (2006), an l1-regularization approach for producing a block-wise sparse model that has attracted a lot of interest in statistics, machine learning, and data mining. It is important to choose an appropriate regularization parameter from a set of candidate values, because it...


Supplier Selection in Grey Environment: A Grey, AHP, Bulls-Eye and ELECTRE Approach

In recent years, the problem of selecting and evaluating suppliers in supply chain management has aroused considerable interest in business firms. Owing to the development of information systems, reaching an appropriate decision through discrete methods has become a necessity. The researchers intend to present a new model in this paper as a contributing factor in the grey environment in which the r...


Stability Approach to Regularization Selection (StARS) for High Dimensional Graphical Models

A challenging problem in estimating high-dimensional graphical models is to choose the regularization parameter in a data-dependent way. The standard techniques include K-fold cross-validation (K-CV), the Akaike information criterion (AIC), and the Bayesian information criterion (BIC). Though these methods work well for low-dimensional problems, they are not suitable in high-dimensional settings. In th...
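The stability-based idea described above can be sketched with a deliberately simplified stand-in estimator: correlation thresholding in place of a sparse graph estimator such as the graphical lasso. Edges are re-estimated on subsamples, and the least regularization whose edge-selection instability stays below a tolerance is chosen. All names, the tolerance value, and the data are illustrative, not taken from the paper.

```python
import numpy as np

def graph_at(data, lam):
    """Stand-in graph estimator: edge (i, j) present if |sample correlation|
    exceeds the threshold lam (larger lam = sparser = more regularized)."""
    C = np.corrcoef(data, rowvar=False)
    A = (np.abs(C) > lam).astype(float)
    np.fill_diagonal(A, 0.0)
    return A

def instability(data, lam, n_sub=20, frac=0.8, seed=0):
    """Average edge-wise instability 2*theta*(1-theta) over random subsamples,
    where theta is each edge's selection frequency."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    m = int(frac * n)
    freq = np.zeros((p, p))
    for _ in range(n_sub):
        idx = rng.choice(n, size=m, replace=False)
        freq += graph_at(data[idx], lam)
    theta = freq / n_sub
    xi = 2.0 * theta * (1.0 - theta)
    return xi.sum() / (p * (p - 1))  # mean over off-diagonal pairs

def stability_select(data, grid, tol=0.05):
    """Return the least regularization (smallest threshold here) whose
    instability is within tol; fall back to the most regularized choice."""
    for lam in sorted(grid):
        if instability(data, lam) <= tol:
            return lam
    return max(grid)

# Illustrative use on independent Gaussian data (no true edges).
data = np.random.default_rng(1).standard_normal((50, 4))
choices = [0.2, 0.4, 0.6, 0.8]
lam = stability_select(data, choices)
```

Each 2*theta*(1-theta) term peaks at 0.5 when an edge appears in exactly half the subsamples, so unstable edges push the score up and over-sparse, stable graphs keep it near zero.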




Publication date: 2002